
Langevin algorithm


Double Randomized Underdamped Langevin with Dimension-Independent Convergence Guarantee. Yuanshi Liu, Cong Fang, Tong Zhang. School of Intelligence Science and Technology, Peking University

Neural Information Processing Systems

Sampling from a high-dimensional distribution serves as one of the key components in statistics, machine learning, and scientific computing, and constitutes the foundation of the fields including Bayesian statistics and generative models [Liu and Liu, 2001, Brooks et al., 2011, Song et al.,



Efficient Constrained Sampling via the Mirror-Langevin Algorithm

Neural Information Processing Systems

The sampling problem has attracted considerable attention recently within the machine learning and statistics communities. This renewed interest in sampling is spurred, on one hand, by a wide breadth of applications ranging from Bayesian inference [RC04, DM+19] and its use in inverse problems [DS17], to neural networks [GPAM+14, TR20].



Quantum Algorithms for Sampling Log-Concave Distributions and Estimating Normalizing Constants

Neural Information Processing Systems

Given a convex function f: R^d -> R, the problem of sampling from a distribution proportional to e^{-f(x)} is called log-concave sampling. This task has wide applications in machine learning, physics, statistics, etc.
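To make the log-concave sampling setup concrete, the following is a minimal sketch of the classic (unadjusted, overdamped) Langevin algorithm, which targets a distribution proportional to e^{-f(x)}. The choice f(x) = x^2/2 (so the target is a standard Gaussian), the step size, and the function names are illustrative assumptions, not taken from any of the papers above.

```python
import numpy as np

def ula_sample(grad_f, x0, step=0.01, n_steps=50_000, seed=0):
    """Unadjusted Langevin algorithm (ULA), a basic sketch.

    Iterates x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * N(0, I),
    whose stationary distribution approximates pi(x) proportional to exp(-f(x)).
    """
    rng = np.random.default_rng(seed)
    x = np.atleast_1d(np.array(x0, dtype=float))
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        # Gradient step on f plus Gaussian noise scaled by sqrt(2 * step).
        x = x - step * grad_f(x) + np.sqrt(2 * step) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

# Example: f(x) = x^2 / 2, so grad_f(x) = x and the target is N(0, 1).
samples = ula_sample(grad_f=lambda x: x, x0=[3.0])
```

After a short burn-in the samples approximate N(0, 1) up to a small discretization bias in the variance; the underdamped and mirror variants discussed above refine exactly this basic recursion.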